- Path: sable.ox.ac.uk!mert0053
- From: mert0053@sable.ox.ac.uk (Michael Brewer)
- Newsgroups: comp.lang.c++
- Subject: C++ Streams and System File Limit
- Date: 10 Jan 1996 09:43:15 GMT
- Organization: Oxford University, England
- Message-ID: <4d01nj$ov2@news.ox.ac.uk>
- NNTP-Posting-Host: sable.ox.ac.uk
- X-Newsreader: TIN [version 1.2 PL2]
-
- When a user's per process limit of open files is reached, and a new
- fstream is opened, will the stream's failbit be set? I am getting
- segmentation faults at odd times at present, and this problem
- disappears when I reduce the number of open files the program
- maintains. In fact, most of the open files are created by a library
- class I am using, which uses the C file routines (i.e. fopen). However,
- a test IS made that each file was opened correctly, and I am checking
- all of my fstreams for failure after opening. No errors are reported, the
- program just dies with segmentation fault (not at the time of the
- first use of the stream either). However, when I reduce the number of
- objects of the library class (which in turn reduces the number of open
- files), there are no segmentation faults.
-
- Is it possible for a program to fail like this for the reasons I am
- suggesting? Or is it likely that there is something more fundamentally
- wrong with my code? By the way, I have on the order of 95-100 files open
- when problems occur.
-
- Thanks for your help!
-
- Mike.
-